Decommutate recorded data
You have some data recorded and now you’d like to decommutate it and work with the values inside. How does that work?
Data Flow
First we need a Data Flow with a reader, and specifically a Push Reader: the Player will be pushing data into the stream, rather than the Data Flow pulling it from a device.
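If it helps to picture the push/pull distinction, here’s a minimal sketch in Python (illustrative names only, not the product’s API): the source pushes packets into the stream, and the reader simply hands them to downstream components instead of polling a device.

```python
import queue

# Hypothetical sketch of a push reader: the data source (here, the Player)
# calls push() for each packet; the reader never polls hardware, it just
# hands whatever arrives to the rest of the Data Flow.
class PushReader:
    def __init__(self):
        self._incoming = queue.Queue()

    def push(self, packet: bytes) -> None:
        """Called by the data source for each packet it sends."""
        self._incoming.put(packet)

    def read(self) -> bytes:
        """Called by downstream components; blocks until a packet arrives."""
        return self._incoming.get()
```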
Now we’ll be able to write data to the stream (indicated by the UNC above). Let’s connect the rest of our Data Flow components.
In the Decommutator, choose your map from the drop-down, or edit/create the map with the Edit… button.
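Under the hood, the map is what tells the Decommutator where each parameter lives in the frame and how to convert its raw counts. Conceptually it works something like this sketch (made-up parameter names and layout, not the actual map format):

```python
from dataclasses import dataclass

@dataclass
class MapEntry:
    word: int      # word offset within the minor frame
    mask: int      # bit mask applied to the raw 16-bit word
    shift: int     # right-shift applied after masking
    scale: float   # conversion from counts to engineering units

# Hypothetical map: two made-up parameters for illustration.
DECOM_MAP = {
    "frame_count": MapEntry(word=0, mask=0xFFFF, shift=0, scale=1.0),
    "bus_voltage": MapEntry(word=3, mask=0xFFF0, shift=4, scale=0.01),
}

def decommutate(frame):
    """Extract named parameter values from one frame of 16-bit words."""
    return {
        name: ((frame[e.word] & e.mask) >> e.shift) * e.scale
        for name, e in DECOM_MAP.items()
    }
```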
We’ve now got the bare bones needed to receive the played-back data.
Don’t forget to start the Data Flow so our stream shows up.
Player
Now let’s open Player and click the Recordings Browser button. Choose the recording and click Select; the Player will populate with information from the recording.
In this example the Destination stream doesn’t exist yet, so we see a warning. We’re going to set this field to the stream from our Data Flow earlier.
Paced Playback
If your recording is in Raw format, there’s no way to pace it in this setup and the data will run as fast as the server can process it. If you need to pace a raw stream, the data must be looped back through a serial channel.
If your data is Chapter 10 formatted, you must decide whether to pace it. That decision depends on what you intend to do with the data and whether the system is actively doing other work that shouldn’t be affected.
Pacing will cause the Player to send the data out based on the original timestamp intervals, so data arrives at about the rate it was recorded. That’s useful when the downstream use of the data needs or expects it to arrive in real time.
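In other words, pacing re-inserts the original gaps between packets. A rough sketch of the idea (not the Player’s actual implementation):

```python
import time

def play_paced(packets, send):
    """Replay (timestamp_seconds, payload) pairs at roughly the recorded rate.

    `packets` is an iterable of (timestamp, payload) pairs; `send` pushes one
    payload to the destination stream (both hypothetical here).
    """
    prev_ts = None
    for ts, payload in packets:
        if prev_ts is not None:
            # Wait out the recorded inter-packet gap before sending.
            time.sleep(max(0.0, ts - prev_ts))
        prev_ts = ts
        send(payload)
```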
But if your use case is to analyze the data, you might want to run it as fast as the system allows. For instance, if you’re looking for peaks or events in the data, letting it run unpaced will complete much faster.
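A peak search, for example, has no reason to wait out real-time gaps; it can simply consume frames as fast as they come. A sketch, reusing the hypothetical decommutate function above:

```python
def find_peak(frames, decom, parameter):
    """Scan every frame as fast as possible and track one parameter's maximum."""
    peak = float("-inf")
    for frame in frames:
        peak = max(peak, decom(frame)[parameter])
    return peak

# e.g. find_peak(all_frames, decommutate, "bus_voltage")
```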
Play
The Player and Data Flow are now connected. Hit the Play button and the data will begin arriving in the Data Flow.